Search Results: "eugene"

17 June 2012

Eugene V. Lyubimkin: rtmpdump: error code 32

If you, like me, have spent hours searching for what rtmpdump's errors (when trying to download live streams)

Caught signal: 13, cleaning up, just a second...
followed by

ERROR: WriteN, RTMP send error 32


mean, then: after browsing many public sources (and finding no evidence there), measuring the connection throughput (thanks to the iftop program) and trying different internet providers, I can guess with a high level of confidence that the error means

"your (average) connection speed is slower than what the server expects, and your client has fallen far enough behind the stream that the server has no more data for you"
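For what it's worth, the numbers in the messages are consistent with this: signal 13 is SIGPIPE and, assuming the '32' is the usual C errno value, error 32 is EPIPE ("broken pipe"), i.e. the server side closed the connection while rtmpdump was still writing. A quick sanity check in Python (values shown are for Linux):

```python
import errno
import os
import signal

# "Caught signal: 13" -- SIGPIPE, raised when writing to a closed socket.
print(signal.SIGPIPE)            # 13
# "ERROR: WriteN, RTMP send error 32" -- assuming a C errno value: EPIPE.
print(errno.EPIPE)               # 32
print(os.strerror(errno.EPIPE))  # Broken pipe
```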

10 April 2012

Eugene V. Lyubimkin: Cupt bits

Half a year since the last status update, so here we go:

16 February 2012

Eugene V. Lyubimkin: bureaucratic programming

Suppose that you need to write an interface to a function which draws triangles. It could look like this (the language is C++-based pseudocode):

void rawDrawTriangle(size_t a, size_t b, size_t c) { ... }

bool isValidTriangle(size_t a, size_t b, size_t c)
{
  return (a + b > c) && (a + c > b) && (b + c > a);
}

void drawTriangle(size_t a, size_t b, size_t c)
{
  if (max(a, b, c) > MAX_LINE_LENGTH)
  {
    throw("one of the sides is too big");
  }
  if (!isValidTriangle(a, b, c))
  {
    throw("invalid triangle");
  }
  rawDrawTriangle(a, b, c);
}

void userFunction()
{
  ...
  drawTriangle(3, 5, 7);
  ...
}
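The same validation logic, as a minimal runnable sketch in Python (the function names mirror the pseudocode; MAX_LINE_LENGTH is an arbitrary assumed limit):

```python
MAX_LINE_LENGTH = 1000  # assumed limit; the pseudocode leaves it unspecified

def is_valid_triangle(a, b, c):
    # Triangle inequality: each side must be shorter than the sum of the others.
    return a + b > c and a + c > b and b + c > a

def draw_triangle(a, b, c):
    if max(a, b, c) > MAX_LINE_LENGTH:
        raise ValueError("one of the sides is too big")
    if not is_valid_triangle(a, b, c):
        raise ValueError("invalid triangle")
    # rawDrawTriangle(a, b, c) would be called here
    return (a, b, c)

print(draw_triangle(3, 5, 7))      # (3, 5, 7)
print(is_valid_triangle(1, 2, 3))  # False: degenerate, since 1 + 2 == 3
```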


And the following is how a bureaucratic version of the code could look:

class Certificate
{
  time_t getCreationTime() const { ... }
  Certificate() { ... }
};

class TriangleIsValidCertificate: public Certificate
{
 public:
  const size_t a;
  const size_t b;
  const size_t c;
 private:
  TriangleIsValidCertificate(size_t a, size_t b, size_t c) { ... }
  friend class TriangleCertificationAuthority;
};

class TriangleCertificationAuthority
{
  static TriangleIsValidCertificate getTriangleIsValidCertificate(size_t a, size_t b, size_t c)
  {
    msleep(random() * 2);
    if ((a + b > c) && (a + c > b) && (b + c > a))
    {
      if (a == b && a == c)
      {
        msleep(random() * 10); // hm, suspicious query
      }
      return TriangleIsValidCertificate(a, b, c);
    }
    else
    {
      throw("invalid triangle");
    }
  }
};

class LineIsDrawableCertificate: public Certificate
{
 public:
  const size_t length;
 private:
  LineIsDrawableCertificate(size_t l) { ... }
  friend class LineCertificationAuthority;
};

class LineCertificationAuthority
{
  static LineIsDrawableCertificate getLineIsDrawableCertificate(size_t length)
  {
    msleep(rand());
    if (length <= MAX_LINE_LENGTH)
    {
      return LineIsDrawableCertificate(length);
    }
    else
    {
      throw("the line is too long");
    }
  }
};

void drawTriangle(size_t a, size_t b, size_t c, LineIsDrawableCertificate lineCerts[3], TriangleIsValidCertificate tivCert)
{
  msleep(rand() * 5);
  if (lineCerts[0].length != a || lineCerts[1].length != b || lineCerts[2].length != c)
  {
    throw("your application is rejected");
  }
  if (tivCert.a != a || tivCert.b != b || tivCert.c != c)
  {
    throw("your application is rejected");
  }
  if (time() - tivCert.getCreationTime() > '60 milliseconds')
  {
    throw("your application is rejected");
  }
  msleep(rand());
  rawDrawTriangle(a, b, c);
}

void userFunction()
{
  ...
  size_t a = 3, b = 5, c = 7;
  auto aCert = LineCertificationAuthority::getLineIsDrawableCertificate(a);
  auto bCert = LineCertificationAuthority::getLineIsDrawableCertificate(b);
  auto cCert = LineCertificationAuthority::getLineIsDrawableCertificate(c);
  auto tivCert = TriangleCertificationAuthority::getTriangleIsValidCertificate(a, b, c);
  drawTriangle(a, b, c, array(aCert, bCert, cCert), tivCert);
  ...
}

11 February 2012

Eugene V. Lyubimkin: multiarch (and) hacks

I have experienced situations where I wrote substantial amounts of code (feature branches) for hours, days and even weeks, only to find later that the written code had to be thrown into the bin, either because of hidden design problem(s) or because the negative implications of the change outweighed its positive ones.

After reading some recent multiarch threads on debian-devel@ and seeing what hacks are seriously being proposed just to keep the thing going, I now think that the low-level part of the Debian multiarch implementation proposal is no less broken than its high-level part, and that the whole proposal is one big hack which requires and will require more sub-hacks. Some will benefit from the added functionality, but everyone, maintainers and users alike, will suffer from the drawbacks.

How are the two paragraphs above related? We still have time to revert the changes and say "sorry, it was a nice idea but the software world isn't ready".

5 September 2011

Eugene V. Lyubimkin: cupt: 2.2.0~rc1

It took longer than usual, but I believe it's worth it.

I just released Cupt 2.2.0~rc1 to Debian experimental. To experimental, because there are important changes and I would appreciate "in-field" testing before the final 2.2.0.

Here are the major changes since 2.1.x:



Enjoy and send the bug reports.

16 April 2011

Eugene V. Lyubimkin: cupt: 2.0.0

I released a new major version of Cupt, 2.0.0 (== 2.0.0~rc2), today. The final list of major changes since 1.5.x is available here and in the binary packages. A web copy of the (newly written) tutorial is here.

Thanks to all bug reporters who helped to make it better.

Packages have landed to Debian unstable and should be available on the mirrors soon.

21 March 2011

Eugene V. Lyubimkin: on high-level dependencies in Debian MultiArch spec

Let's assume that all low-level and file-layout work is done, and we need to specify how to make Debian's high-level package managers multi-arch aware.

What we have now: only one package of the same name can be installed in the system, and packages can declare dependencies only against packages of the same architecture.

What, hence, needs to be specified:
a) whether certain packages of the same name but different arches are co-installable or not;
b) how to specify foreign arches in dependencies.

How, then, could our high-level multi-arch spec be written?

a) new field 'MultiArchCoinstallable: yes' for co-installable packages;
b) new dependency syntax: package_name[:[!]arch[,arch]...], for example: 'perl' (assuming 'perl:native' for forward dependencies like 'Depends' or 'Recommends', and 'perl:any' for Conflicts and Breaks), 'perl:amd64', 'perl:amd64,i386', 'perl:!i386', 'perl:native', 'perl:any'.
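To illustrate, here is a small Python sketch of how such qualifiers could be parsed; the function and its return shape are purely illustrative, not part of any spec:

```python
def parse_dep(dep, default_arch="native"):
    """Parse the proposed 'package_name[:[!]arch[,arch]...]' syntax.

    Returns (name, negated, arches). A bare name defaults to 'native',
    as assumed above for forward dependencies like Depends/Recommends.
    """
    if ":" not in dep:
        return (dep, False, [default_arch])
    name, spec = dep.split(":", 1)
    negated = spec.startswith("!")
    if negated:
        spec = spec[1:]
    return (name, negated, spec.split(","))

print(parse_dep("perl"))             # ('perl', False, ['native'])
print(parse_dep("perl:amd64,i386"))  # ('perl', False, ['amd64', 'i386'])
print(parse_dep("perl:!i386"))       # ('perl', True, ['i386'])
```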

That's all. Looks easy and mostly intuitive, doesn't it? However, some people are going to implement this instead.

6 March 2011

Eugene V. Lyubimkin: cupt: 2.0.0~beta1

Cupt v2 has reached a beta stage.

Main news of Cupt v2 comparing to Cupt v1 are grouped here.

Notable changes since 2.0.0~alpha3:



Full changelog is available in debian/changelog, as usual.

binary packages for i386, source package and README

21 January 2011

Raphaël Hertzog: People behind Debian: Michael Vogt, synaptic and APT developer

Michael and his daughter Marie

Michael has been around for more than 10 years and has always contributed to the APT software family. He's the author of the first real graphical interface to APT: synaptic. Since then he created software-center as part of his work for Ubuntu. Being the most experienced APT developer, he's naturally the coordinator of the APT team. Check out what he has to say about APT's possible evolutions. My questions are in bold, the rest is by Michael.

Who are you?

My name is Michael Vogt, I'm married and have two little daughters. We live in Germany (near Trier) and I work for Canonical as a software developer. I joined Debian as a developer in early 2000 and started to contribute to Ubuntu in 2004.

What's your biggest achievement within Debian or Ubuntu?

I cannot decide on a single one so I will just be a bit verbose. From the very beginning I was interested in improving the package manager experience and the UI on top for our users. I'm proud of the work I did with synaptic. It was one of the earliest UIs on top of apt. Because of my work on synaptic I got into apt development as well and fixed bugs there and added new features. I still do most of the uploads here, but nowadays David Kalnischkies is the most active developer. I also wrote a bunch of tools like gdebi, update-notifier, update-manager, unattended-upgrade and software-properties to make the update/install situation easier for the user to deal with. Most of the tools are written in Python so I added a lot of improvements to python-apt along the way, including the initial high-level apt interface and a bunch of missing low-level apt_pkg features. Julian Andres Klode made a big push in this area recently and thanks to his effort the bindings are fully complete now and have good documentation. My most recent project is software-center. Its aim is to provide a UI strongly targeted at end-users. The goal of this project is to make finding and installing software easy and beautiful.
We have a fantastic collection of software to offer and software-center tries to present it well (including screenshots, instant search results and soon ratings & reviews). This builds on great foundations like aptdaemon by Sebastian Heinlein, screenshots.debian.net by Christoph Haas, ddtp.debian.org by Michael Bramer, apt-xapian-index by Enrico Zini and many others (this is what I love about free software: it usually adds, rarely takes away).

What are your plans for Debian Wheezy?

For apt I would love to see a more pluggable architecture for the acquire system. It would be nice to be able to make apt-get update (and the frontends that use this from libapt) able to download additional data (like debtags or an additional index file that contains more end-user targeted information). I also want to add some scripts so that apt (optionally) creates btrfs snapshots on upgrade and provide some easy way to roll back in case of problems. There is also some interesting work going on around making the apt problem resolver a more pluggable part. This way we should be able to do much faster development. software-center will get ratings & reviews in the upstream branch, I really hope we can get that into Wheezy.

If you could spend all your time on Debian, what would you work on?

In that case I would start with a refactor of apt to make it more robust about ABI breaks. It would be possible to move much faster once this problem is solved (it's not even hard, it just needs to be done). Then I would add a more complete testsuite. Another important problem to tackle is to make maintainer scripts more declarative. I triaged a lot of upgrade bug reports (mostly in Ubuntu though) and a lot of them are caused by maintainer script failures. Worse is that, depending on the error, it's really hard for the user to solve the problem. There is also a lot of code duplication. Having a central place that contains well-tested code to do these jobs would be more robust.
Triggers help us a lot here already, but I think there is still more room for improvement.

What's the biggest problem of Debian?

That's a hard question :) I mostly like Debian the way it is. What frustrated me in the past were flamewars that could have been avoided. To me being respectful to each other is important; I don't like flames and insults because I like solving problems, and fighting like this rarely helps that. The other attitude I don't like is to blame people and complain instead of trying to help and be positive (the difference between "it sucks because it does not support $foo" and "it would be so helpful if we had $foo because it enables me to do $bar").

For a long time, I had the feeling you were mostly alone working on APT and were just ensuring that it keeps working. Did you also have this feeling and are things better nowadays?

I felt a bit alone sometimes :) That being said, there were great people like Eugene V. Lyubimkin and Otavio Salvador during my time who did a lot of good work (especially at release crunch times) and helped me with the maintenance (but got interested in areas other than apt later). And now we have the unstoppable David Kalnischkies and Julian Andres Klode. Apt is too big for a single person, so I'm very happy that especially David is doing superb work on the day-to-day tasks and fixes (plus big projects like multiarch and the important but not very thankful testsuite work). We talk about apt stuff almost daily, doing code reviews and discussing bugs. This makes the development process much more fun and healthy. Julian Andres Klode is doing interesting work around making the resolver more pluggable and Christian Perrier is as tireless as always when it comes to the translation merging. I did a quick grep over the bzr log output (including all branch merges) and counted around ~4300 total commits (including all revisions of branches merged). Of those there are ~950 commits from me plus an additional ~500 merges.
It was more than just ensuring that it keeps working, but I can see where this feeling comes from as I was never very verbose. Apt also was never my only project; I am involved in other upstream work like synaptic or update-manager or python-apt etc. This naturally reduced the time available to hack on apt and spend doing the important day-to-day bug triage, responding to mailing list messages etc. On the python-apt side Julian Andres Klode did great work to improve the code and the documentation. It's a really nice interface and if you need to do anything related to packages and love Python I encourage you to try it. It's as simple as:
import apt
cache = apt.Cache()
cache["update-manager"].mark_install()
cache.commit()
Of course you can do much more with it (update-manager, software-center and lots more tools use it). With pydoc apt you can get a good overview. The apt team always welcomes contributors. We have a mailing list and an IRC channel and it's a great opportunity to solve real world problems. It does not matter if you want to help triage bugs or write documentation or write code, we welcome all contributors.

You're also an Ubuntu developer employed by Canonical. Are you satisfied with the level of cooperation between both projects? What can we do to get Ubuntu to package new applications developed by Canonical directly in Debian?

Again a tricky question :) When it comes to cooperation there is always room for improvement. I think (with my Canonical hat on) we do a lot better than we did in the past. And it's great to see the current DPL coming to Ubuntu events and talking about ways to improve the collaboration. One area where I feel Debian would benefit is to be more positive about NMUs and shared source repositories (collab-maint and LowThresholdNmu are good steps here). The lower the cost to push a patch/fix (e.g. via direct commit or upload), the more there will be. When it comes to getting packages into Debian I think the best solution is to have a person in Debian as a point of contact to help with that. Usually the amount of work is pretty small as the software will have a debian/* dir already with useful stuff in it. But it helps me a lot to have someone doing the Debian uploads, responding to the bugmail etc (even if the bugmail is just forwarded as upstream bug reports :) IMO it is a great opportunity especially for new packagers as they will not have to do a lot of packaging work to get those apps into Debian. This model works very well for me for e.g. gdebi (where Luca Falavigna is really helpful on the Debian side).

Is there someone in Debian that you admire for his contributions?

There are many people I admire. Probably too many to mention them all.
I always find it hard to single out individual people because the project as a whole can be so proud of its achievements. The first name that comes to my mind is Jason Gunthorpe (the original apt author) whom I've never met. The next is Daniel Burrows whom I met and was inspired by. David Kalnischkies is doing great work on apt: from contributing his first (small) patch to being able to virtually fix any problem and adding big features like multiarch support in about a year. Sebastian Heinlein for aptdaemon. Christian Perrier has always been one of my heroes because he cares so much about i18n. Christoph Haas for screenshots.debian.net, Michael Bramer for his work on Debian translated package descriptions.
Thank you to Michael for the time spent answering my questions. I hope you enjoyed reading his answers as much as I did. Subscribe to my newsletter to get my monthly summary of the Debian/Ubuntu news and to not miss further interviews. You can also follow along on Identi.ca, Twitter and Facebook.


6 January 2011

Eugene V. Lyubimkin: cupt: 2.0.0~alpha3

Notable changes since 2.0.0~alpha2:

Full changelog is available in debian/changelog, as usual.

binary packages for i386, source package and README

10 December 2010

Raphaël Hertzog: People behind Debian: David Kalnischkies, an APT developer

The two first interviews were dedicated to long-time Debian developers. This time I took the opposite approach: I interviewed David Kalnischkies, who is not (yet) a Debian developer. But he's been contributing to one of the most important pieces of software within Debian, the APT package manager, since 2009. You can already see him in many places in Debian sharing his APT knowledge when needed. English is not his native language and he's a bit shy, but he accepted the interview nevertheless. I would like to thank him for the efforts involved and I hope his story can inspire some people to take the leap and just start helping. My questions are in bold, the rest is by David.

Who are you?

I am David Kalnischkies, 22 years old, living in the small town Erbach near Wiesbaden in Germany, and I'm studying computer science at the TU Darmstadt. Furthermore, for more than half a decade now, I have been a youth group leader in my hometown. I never intended to get into this position, but it has similarities with my career in this internet-thingy here. I don't remember why, but in April 2009 I was at a stage where some simple bugs in APT annoyed me so much that I grabbed the source, and, most importantly (I don't know why I did it), I published my changes in May with #433007, a few more bugs and even a branch on Launchpad. And this public branch got me into all this trouble in June: I got a mail from Mr. package management Michael Vogt regarding this branch. A few days later I joined an IRC session with him and shortly after that my name appeared for the first time in a changelog entry. It's a strange but also addicting feeling to read your own name in an unfamiliar place. And even now, after many IRC discussions, bugfixes and features, three Ubuntu Developer Summits and a Google Summer of Code in Debian, my name still appears in places I had never even thought about, e.g. in an interview.

What's your biggest achievement within Debian?

I would like to answer "MultiArch in APT" as it was my Google Summer of Code project, but as it has (not much) use for the normal user at this point (which will hopefully change for wheezy) I chose three smaller things in squeeze's APT that many people don't even know yet: If your impression is now that I only do APT stuff: that's completely right, but that's already more than enough for me for now, as the toolchain behind the short name APT contains so many tools and use cases that you always have something different.

You're an active member of the APT development team. Are there plans for APT in Debian Wheezy? What features can we expect?

That's very hard to answer, as the team is too small to be able to really plan something. I mean, you can have fancy plans and everything, and half a second later someone arrives on the mailing list with a small question which eats days of development time just for debugging. But right now the TODO list contains (in no particular order): We will see what will become real for wheezy and what is postponed, but one thing is sure: more will be done for wheezy if you help!

If you could spend all your time on Debian, what would you work on?

I would spend it on APT's debbugs count: zero would be cool to look at! We make progress in this regard, but with the current velocity we will reach it in ten years or so. Reading more mailing lists would be interesting, as I am kind of an information junkie. Maintaining a package could be interesting, to share the annoyance of a maintainer with handcrafted dependencies just to notice that APT doesn't get it in the way I intended it to be. Though, to make it feel real, I would need to train a few new APT contributors first so they can point my mistakes out, but this unfortunately doesn't depend so much on time as on victims. Maybe I could even be working on getting an official status. Besides that, I would love to be able to apt-get dist-upgrade the increasing mass of systems I and many others carry around in our pockets.
In regards to my phone, this is already fixed, but there is much room for improvement.

What's the biggest problem of Debian?

You need to be lucky. You need to talk at the right time to the right person. That's not really a Debian-only problem as such, but in a global project full of volunteers you can see it clearly, as there are plenty of opportunities to be unlucky. For example, it's unlikely that an interview would be made with me now if Michael had not contacted me in June 2009. In a big project like Debian, you are basically completely lost without a mentor guiding you, so things like the debian-mentors list are good projects, but I am pretty certain they could benefit from some more helping hands. The other thing which I consider a problem is that (and I read this from time to time) some people don't care for translations. That's bad. Yes, a developer is able to read English, otherwise s/he couldn't write code or participate on the mailing lists. Still, I personally prefer to use a translated application if I have the chance, as it's simply easier for me to read in my mother tongue; not only because I am dyslexic, but because my mind still thinks in German and not in English. Yes, I could personally fix that by thinking in English only from now on, but it's a quite big problem to convince my family, who are not really familiar with tech stuff, to use something if they can't understand what is written on screen. It was hard enough to tell my mother how to write an SMS in a German interface. My phone with English words all over the place would be completely unusable for her, despite the fact that my phone is powered by Debian and better for the task from a technical point of view.

You are not yet an official Debian developer/maintainer, but you're already perceived in the community as one of the most knowledgeable persons about APT. It's a great start! What's your advice to other people who want to start contributing to Debian in general, and to APT in particular?

It was never a goal in my life to "start contributing". My goal was and still is to make my life easier by letting the computer work for me. At some point APT hindered the success of this goal, so it needed to be fixed. I didn't expect to open Pandora's box. So, my advice is simple: Just start. Ignore the warning signs telling you that this is not easy. They are just telling you that you are doing something useful. Only artificial problems are easy. Furthermore, contributing to APT, dpkg or any other existing package is in no way harder than opening an ITP and working on your own, and it's cooler, as you have a similarly minded team around you to talk to. :) "APT didn't accept release codenames as target release" was one of the first things I fixed. If I had asked someone whether that would be a good starting point, the answer would have been a clear "no", but I didn't search for a good starting point. As a kid, I can start playing football by just walking onto the field and playing, or I can sit near the field, watching the others play, while analyzing which position would be the best for me to start, ruling them out one by one as the technical requirements seem too high: "Oh, bicycle kick, that sounds complicated, I can't do that."

Julian Andres Klode is working on an APT replacement, and there's also Cupt by Eugene V. Lyubimkin. Both projects started because their authors are not satisfied with APT; they find APT's code difficult to hack, partly due to the usage of C++. Do you share their concerns and what's your opinion on those projects?

I don't think C++ is a concern in this regard; after all, cupt is currently being rewritten in C++0x, and APT2 started in Vala and is now C + glib, last time I checked at least.
I personally think that something is wrong if we need to advertise an application by saying in which language it is written. The major problem for APT is probably that the code is old: APT has done its job for more than 12 years now, under different maintainers, with an always changing environment around it: so there are lines in APT which date from a time when nobody knew what a Breaks dependency is, that packages can have long descriptions which can be translated, or even that package archives can be signed with a gpg key! And yet we take all those for granted today. APT has proven to adapt to these changes in the environment and became very popular in the process. So I don't think the point is near (if it will come at all) that APT can go into retirement as it is completely replaced by something else. The competitors, on the other hand, have their first 12 years still to go. And it will be interesting to see how they will evolve and what will be the state of the art in 2022. But you asked what I think about the competitors: I prefer the revolution from inside, simply because I can see effects faster, as more users will profit from it now. Cupt and co. obviously prefer the normal revolution. The goal is the same, creating the best package manager tool, but the chosen way to the goal is different. aptitude and cupt have an interactive resolver for example: that's something I dislike personally; for others that is the ultimate killer feature. cupt reading the same preference file as APT will produce a different pinning result, which we should consider each time someone mentions the words "drop-in replacement". APT2 isn't much more than the name (which I completely dislike) currently, from a user point of view, so I can't really comment on that.
All of them make me sad, as each line invested in boilerplate code like configuration file parsing would in my eyes be better spent on a bugfix or new feature instead, but I am not here to tell anyone what they should do in their free time. But frankly, I don't see them really as competitors: I use the tools I use; if others do that too, that's good, if not, that's their problem. :) The thing that really annoys me are claims like "the plan is to remove APT by 2014", as this generates a vi vs. emacs like atmosphere we don't need. If some people really think emacs is a good editor, who cares? I really hope we all can drink a beer in 2022 in Milliways, the restaurant at the end of the package universe, remembering the good old 2010. ;)

Is there someone in Debian that you admire for his contributions?

No, not one, many! Michael Vogt, who holds a near-monopoly on package manager maintenance by being upstream of APT, synaptic and software-center (to name only the biggest) and still has the time to answer even the dumbest of my questions. :) Jason Gunthorpe, for being one of the initial developers behind deity, whom I will probably never meet in person except in old comments and commit logs. Christian Perrier, for caring so much about translations. Obey Arthur Liu, as a great admin for Debian's participation in Google's Summer of Code. Paul Wise, for doing countless reviews on debian-mentors, which are a good source of information not only for the maintainer of the package under review. I guess I need to stop here because you asked for just one. So let's end with some big words instead: I am just a little cog in the big Debian wheel.
Thank you to David Kalnischkies for the time spent answering my questions. I hope you enjoyed reading his answers as much as I did. Subscribe to my newsletter to get my monthly summary of the Debian/Ubuntu news and to not miss further interviews. You can also follow along on Identi.ca, Twitter and Facebook.


14 November 2010

Eugene V. Lyubimkin: cupt: 2.0.0~alpha2

The second alpha of Cupt2 is out at the same download place.

Changes since first alpha:
The library API is still subject to change. If you care, please install the documentation package and mail me (see the mail address in README) your thoughts and suggestions.

26 September 2010

Eugene V. Lyubimkin: cupt: 2.0.0~alpha1

The first alpha version of Cupt version 2 is available here: http://people.debian.org/~jackyf/cupt2/.

Notable changes since the 1.y series:
- rewritten in C++(0x);
- improved speed;
- improved RAM usage;
- fewer dependencies.

All other technical details, including not yet implemented things, are described in http://people.debian.org/~jackyf/cupt2/README.

8 July 2010

Eugene V. Lyubimkin: bits about cupt

1.5.x branch

This is the branch of Cupt I intend to put into Squeeze; I fix rare bugs there from time to time. Today I uploaded the version with support for the architecture wildcards in source packages, defined in the recently released Debian Policy 3.9.0.

2.0 branch
At the same time, I started a rewrite of the whole thing in C++ to make Cupt not only (as far as possible) feature-rich and bug-free, but also fast, following the example of the Git DVCS. I have a good chance of finishing this task before the Squeeze+1 release.

18 April 2010

Eugene V. Lyubimkin: dealing with unwanted recommends using cupt

Some people use the '--without-recommends' switch and the problem is gone. I think that's overkill.

This post explains how I would recommend dealing with unwanted recommends. The following text is about the cupt package manager, and though the ideas are applicable to any package manager, the techniques may differ.


Stage 1. Facing problems

Suppose we want to install the package kde-minimal:

-8<-
$ sudo cupt install kde-minimal
[sudo] password for jackyf:
Building the package cache... [done]
Initializing package resolver and worker... [done]
Scheduling requested actions... [done]
Resolving possible unmet dependencies...
The following 16 packages will be INSTALLED:

dolphin kappfinder kde-minimal kdebase-apps kdebase-bin kdebase-data kdepasswd
kfind kinfocenter konqueror konqueror-nsplugins kwrite libkonq5
libkonq5-templates libkonqsidebarplugin4 plasma-widget-folderview

The following 1 packages will be UPGRADED:

konsole

Need to get 6024KiB/6024KiB of archives. After unpacking 14.9MiB will be used.
Do you want to continue? [y/N/q]
->8-

Umm, kinfocenter? No, we don't want it to be installed.

Stage 2. Searching for a reason

Now we will use the 'reason tracking' feature of libcupt's resolver. Let's add the '-s' switch (simulation) and the '-D' switch (reason tracking):

-8<-
$ cupt -s install kde-minimal -D
Building the package cache... [done]
Initializing package resolver and worker... [done]
Scheduling requested actions... [done]
Resolving possible unmet dependencies...
The following 16 packages will be INSTALLED:

dolphin
reason: kdebase-apps 4:4.3.4-1 depends on 'dolphin (>= 4:4.3.4-1)'

kappfinder
reason: kdebase-apps 4:4.3.4-1 depends on 'kappfinder (>= 4:4.3.4-1)'

kde-minimal
reason: user request

kdebase-apps
reason: kde-minimal 5:55 depends on 'kdebase-apps (>= 4:4.3.1)'

kdebase-bin
reason: kdebase-apps 4:4.3.4-1 depends on 'kdebase-bin (>= 4:4.3.4-1)'

kdebase-data
reason: kappfinder 4:4.3.4-1 depends on 'kdebase-data (= 4:4.3.4-1)'

kdepasswd
reason: kdebase-apps 4:4.3.4-1 depends on 'kdepasswd (>= 4:4.3.4-1)'

kfind
reason: kdebase-apps 4:4.3.4-1 depends on 'kfind (>= 4:4.3.4-1)'

kinfocenter
reason: kdebase-apps 4:4.3.4-1 recommends 'kinfocenter (>= 4:4.3.4-1)'

konqueror
reason: kdebase-apps 4:4.3.4-1 depends on 'konqueror (>= 4:4.3.4-1)'

konqueror-nsplugins
reason: kdebase-apps 4:4.3.4-1 recommends 'konqueror-nsplugins (>= 4:4.3.4-1)'

kwrite
reason: kdebase-apps 4:4.3.4-1 depends on 'kwrite (>= 4:4.3.4-1)'

libkonq5
reason: kdebase-apps 4:4.3.4-1 depends on 'libkonq5 (>= 4:4.3.4-1)'

libkonq5-templates
reason: libkonq5 4:4.3.4-1 depends on 'libkonq5-templates kdesktop'

libkonqsidebarplugin4
reason: konqueror 4:4.3.4-1 depends on 'libkonqsidebarplugin4 (>= 4:4.3.4)'

plasma-widget-folderview
reason: kdebase-apps 4:4.3.4-1 depends on 'plasma-widget-folderview (>= 4:4.3.4-1)'


The following 1 packages will be UPGRADED:

konsole
reason: synchronized with package 'kdebase-apps'


Need to get 6024KiB/6024KiB of archives. After unpacking 14.9MiB will be used.
Do you want to continue? [y/N/q]
->8-

Now the output is much bigger: every package has the reason(s) for its state change appended. Let's follow the chain: kinfocenter is to be installed because... kdebase-apps is to be installed. Looking further, kdebase-apps is to be installed because... of kde-minimal. The full chain is 'kde-minimal -> kdebase-apps -> kinfocenter'.

Stage 2.5. Analyzing reason

After determining the full chain, ask yourself: does this dependency chain look OK? Maybe some Depends/Recommends should be relaxed to a less strict dependency? If so, it's a good idea to run 'reportbug <package>' to report the unwanted dependency.

In this example, the dependency chain looks OK (at least, to me :)).

Stage 3. Installing a package without unwanted dependency(ies)

The resolution of the problem depends on the answer to the question "Why don't I want this package to be installed?"

Case 1. Only this time/package

If the answer is, say, "Because dependency 'kdebase-apps -> kinfocenter' doesn't make sense on this system", then you do:

-8<-
$ sudo cupt install kde-minimal kinfocenter-
[sudo] password for jackyf:
Building the package cache... [done]
Initializing package resolver and worker... [done]
Scheduling requested actions... [done]
Resolving possible unmet dependencies...
The following 15 packages will be INSTALLED:

dolphin kappfinder kde-minimal kdebase-apps kdebase-bin kdebase-data kdepasswd kfind konqueror konqueror-nsplugins kwrite libkonq5 libkonq5-templates libkonqsidebarplugin4 plasma-widget-folderview

The following 1 packages will be UPGRADED:

konsole

Need to get 5463KiB/5463KiB of archives. After unpacking 13.2MiB will be used.
Do you want to continue? [y/N/q]
->8-

And that's all, kinfocenter won't be installed.

Consequences:

1. This answer implies that you are not against installing kinfocenter if some other package needs it. So, when you later install other packages which recommend kinfocenter, cupt will suggest installing it again.

2. If you later upgrade the package and the new version contains the same relation, cupt will not suggest installing kinfocenter.

Case 2. Forever

If the answer is "Because I don't want to see kinfocenter on my system ever", then a good choice is the APT pinning system.

Place the following three lines into your APT preferences file (usually /etc/apt/preferences, or any file in the /etc/apt/preferences.d directory):

-8<-
Package: kinfocenter
Pin: version *
Pin-Priority: -10000
->8-

-10000 is a very low priority (usual priorities lie between 0 and 1000), so cupt will never suggest installing kinfocenter because of soft dependencies (Recommends or Suggests). Moreover, if in some action cupt needs to choose between, say, 'kde-virtual' and 'kde-virtual-alternate', where kde-virtual depends on kinfocenter and kde-virtual-alternate does not, it will choose kde-virtual-alternate. Exceptions are possible in (very unlikely) cases where not installing kinfocenter would lead to a very bad solution for the selected actions. The lower the priority kinfocenter has, the smaller its chances of being installed when not requested explicitly (as in 'cupt install kinfocenter').
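If you prefer keeping such overrides in separate snippet files, the same pin can go into its own file under preferences.d. A minimal sketch (PREF_DIR is a stand-in so the example can run anywhere; on a real system it would be /etc/apt/preferences.d, and the snippet name is an arbitrary choice):

```shell
# Write the kinfocenter pin into a dedicated preferences snippet.
# PREF_DIR stands in for /etc/apt/preferences.d; 'no-kinfocenter'
# is an arbitrary file name.
PREF_DIR=${PREF_DIR:-.}
cat > "$PREF_DIR/no-kinfocenter" <<'EOF'
Package: kinfocenter
Pin: version *
Pin-Priority: -10000
EOF
echo "pin written to $PREF_DIR/no-kinfocenter"
```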

9 April 2010

David Welton: US Exports: "The Cloud"?

An Economist special report in last week's print edition talks about how the US will need to focus more on savings and exports:

A special report on America's economy: Time to rebalance

I've been thinking about that for a while too, especially after the dollar's recent weakness, although it has been strengthening some lately, apparently due to the state of Greece's finances. I think that the computing industry is, in general, well poised to take advantage of that. For instance, what could be easier to export than computing power or "Software as a Service"? All it takes is a few minutes for someone in Europe to sign up to a US-based service with a credit card. For instance, compare Linode's prices and good service with most of their European competitors (gandi.net for instance, who are good people, and you have to love that they put "no bullshit" right on their front page). Not that they don't have good service in Europe, but it's very difficult to compete on price with the dollar being significantly cheaper. With the dollar where it is right now, gandi is almost, but not quite, competitive with Linode. If you don't include taxes. If the dollar weakens again, though, things could easily tilt far in Linode's favor.

Besides a weak dollar, I think it will be important for US companies in a position to do so to focus on "the rest of the world". The US is a big, populous country where it's very easy to forget about far-off lands. Compare my home town of Eugene, Oregon to where I live in Padova. Google Maps says that it takes 7+ hours to drive to Vancouver, Canada (which, to tell the truth, isn't all that foreign, in that they speak English with an accent much closer to mine than, say, Alabama or Maine). Going south, Google says it's 15+ hours just to San Diego, although I think that's optimistic myself, given traffic in California. From Padova, I can be in France in 5 hours, according to Google; 3 hours to Switzerland; 4 to Innsbruck, in Austria; less than 3 hours to the capital of Slovenia, Ljubljana; and around 3 hours to Croatia, too. And if you wanted to throw in another country, the Republic of San Marino is also less than 3 hours away, according to Google's driving time estimates.

You could probably live your entire life in a place like Eugene and never really deal much with foreigners, whereas here, nearby borders are both a historic and an ever-present fact. The outcome of this is that, to some degree, people in the US have traditionally focused their businesses "inwards" until they reached a certain size. Which is, of course, a natural thing to do when you have such a big, homogeneous market to deal with before you even start thinking about foreign languages, different laws, exchange rates and all the hassle those things entail. However, if exchange rates hold steady or favor the US further, and internal spending remains weak, it may be sensible for companies to invest some time and energy in attracting clients in "the rest of the world".

"Cloud" companies (anyone got a better term? This one's awfully vague, but I want to encompass both "computing power", like Linode or Amazon's EC2, and "software as a service") will likely have a much easier time of things: for many services, it's easy to just keep running things in the US for a while and worry about having physical or legal infrastructure abroad later. Your service might not be quite as snappy as it would be with a local server, but it'll do, if it performs a useful function. Compare that with a more traditional business, where you might have to do something like open a factory abroad, or at the very least figure out the details of how to ship physical products abroad and sell them, and do so in a way that you're somewhat insured against the large array of things that could go wrong between sending your products on their merry way and someone buying them in Oslo, Lisbon or Prague. Since this barrier to entry is lower, it makes more sense to climb over it earlier on.

As an example, Linode recently did a deal to provide VPS services from a London data center, to make their service more attractive to European customers. However, they still don't appear to have marketing materials translated into various languages, and presumably they don't have support staff capable of speaking languages like Chinese, German or Russian either (well, at least not in an official capacity). This isn't to pick on them; they may have considered those things and found them too much of an expense/distraction/hassle for the time being (they certainly know their business better than I do) and simply be content to make do with English. Other businesses, however, may decide that a local touch is important to attracting clients. What do you need to look at to make your service more attractive to people in other countries? In no particular order:

There's certainly no lack of work there, but on the other hand, it's possible to do almost all of it from wherever you happen to be located, rather than spending lots of money and time flying around to remote corners of the globe, as is still common practice in many industries.

2 January 2010

Eugene V. Lyubimkin: cupt 1.5

Today, one year after the first commit to the git repository, I released Cupt 1.5 and uploaded it to Debian unstable.

Notable changes since the last blog post about cupt 1.2:

1.3: various speed and RAM usage improvements
1.5: priorities for download protocols and download methods, download methods are now pluggable; bash completion for 'cupt' program

Have fun :)

17 December 2009

Eugene V. Lyubimkin: a small script to satisfy build-dependencies from .dsc

Here it is: http://paste.debian.net/54275/. It needs /usr/bin/cupt from testing or unstable to work. Please report bugs or suggestions in the comments to this post, or anywhere you can reach me.

Typical usage:

1) './script package.dsc -s' - preview
2) 'sudo ./script package.dsc' - do the work

Type './script' to see usage.

I will move it to more permanent storage soon.
UPD: Moved. http://people.debian.org/~jackyf/satisfy-dsc-deps.
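For the curious, the core idea of such a script can be sketched in a few lines of shell. This is my illustration, not the script itself; it assumes a single-line Build-Depends field and cupt's 'satisfy' command taking a relation expression, and the example .dsc below is made up:

```shell
# Illustrative sketch only; the script linked above is the real thing.
# Create a made-up .dsc for demonstration purposes:
cat > example.dsc <<'EOF'
Source: example
Version: 1.0-1
Build-Depends: debhelper (>= 7), libfoo-dev (>= 0.2)
EOF
# Extract the Build-Depends field (assumes it fits on one line):
deps=$(sed -n 's/^Build-Depends: *//p' example.dsc)
# The real work would then be: cupt satisfy "$deps" (with -s for a preview)
echo cupt satisfy "$deps"
```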

13 December 2009

Julian Andres Klode: APT2 progress report for the 1st half of December


This week was successful. I have pushed some changes from November to the repository which change the license to LGPL-2.1+ (which makes bi-directional sharing of code with other projects easier, since most Vala code is under the same license) and implement HTTP using libsoup2.4 directly, instead of using GIO and GVFS for this. I also added a parser for the sources.list format which uses regular expressions to parse the file and is relatively fast. The code needs a current checkout of Vala's git master to work correctly, as released versions had a bug which I noticed today and Jörg Billeter fixed in Vala 25 minutes later; thank you, Jörg.

While nothing else happened in the public repository, the internal branch has seen a lot of new code, including SQLite 3 caches, Acquire text progress handling, and capt, the command-line advanced package tool. Most of the code will need to be reworked before it will be published, but I hope to have this completed by Christmas. It will also depend on Vala 0.7.9 or newer, which is yet to be released.

The decision to use SQLite 3 as a backend means that we won't see the size limitations APT has, and that development can be simplified by using SQL queries for filtering requests. It also means that APT2 will be very fast in most actions, like searching, which currently happens in 0.140 seconds (unstable, experimental and more repositories enabled), whereas aptitude takes 1.101 seconds, cupt (which has no on-disk cache) 1.292 seconds, and apt-cache 0.475 seconds. Searching is performed by one SQL query.

I also want to thank Jens Georg <mail@jensge.org>, who wrote Rygel's Database class, which is also used (with minor modifications, like defaulting to in-memory journals) in APT2. Rygel.Database is a small wrapper around sqlite3 which makes programming easier for Vala programmers.
The command-line application capt provides a shell based on readline with history (and, later on, command completion) as well as direct usage like 'capt config dump' or 'capt search python-apt'. Just as with Eugene's cupt, capt will be the only program in the core APT2 distribution and will provide the same functionality currently provided by apt-get, apt-config and friends. The name is not perfect and can easily be confused with 'cupt', but it was the closest option for now, considering that the name 'apt' is already used by Java (for its Annotation Processing Tool). That's all for now; I'll tell you more once all those features have passed my QA and there is really something usable in the repository. In the meanwhile, you can discuss external dependency solvers, database layouts and other stuff in their threads on deity@lists.debian.org. And a screenshot from capt:
jak@hp:~/Desktop/APT2:temp$ capt
apt$ help
APT2 0.0.20091213 command-line frontend
Commands:
  config dump               Dump the configuration
  config get OPTION         Get the given option
  config set OPTION VALUE   Set the given option
  search EXPRESSION         Search for the given expression
  show PACKAGE              Show all versions of the given package
  sources list              Print a list of all sources
  version                   Print the version of APT2
apt$ search python-apt
build-depends-python-apt - Dummy package to fulfill package dependencies
python-apt - Python interface to libapt-pkg
python-apt-dbg - Python interface to libapt-pkg (debug extension)
python-apt-dev - Python interface to libapt-pkg (development files)
python-aptdaemon - Python module for the server and client of aptdaemon
python-aptdaemon-gtk - Python GTK+ widgets to run an aptdaemon client
apt$
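The "searching is one SQL query" point can be illustrated with a toy example. This is purely my illustration: APT2's actual schema was not published at this point, so the table layout below is made up and only shows the shape of the idea:

```shell
# Made-up schema illustrating search-as-one-SQL-query against an
# SQLite cache; NOT APT2's real table layout. Needs the sqlite3 CLI.
sqlite3 toy-cache.db <<'EOF'
CREATE TABLE packages (name TEXT PRIMARY KEY, description TEXT);
INSERT INTO packages VALUES ('python-apt', 'Python interface to libapt-pkg');
INSERT INTO packages VALUES ('python-apt-dbg', 'Python interface to libapt-pkg (debug extension)');
SELECT name || ' - ' || description FROM packages WHERE name LIKE '%python-apt%';
EOF
```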
Posted in APT2

4 November 2009

Eugene V. Lyubimkin: cupt-standalone

Some people worried that cupt has a lot of dependencies, so it's uninstallable on very limited systems. I probably have good news for them:


$ dpkg -l | grep perl
ii liblocale-gettext-perl 1.05-4 Using libc functions for internationalizatio
ii perl-base 5.10.1-6 minimal Perl system

$ ./cupt-compiled-i386 -s full-upgrade
Building the package cache... [done]
Initializing package resolver and worker... [done]
Scheduling requested actions... [done]
Resolving possible unmet dependencies...
The following 15 packages will be INSTALLED:

dash gnupg-curl insserv install-info libc-bin libc6-i686 libdb4.7 [...]

The following 118 packages will be UPGRADED:

apt apt-utils base-files base-passwd bash bsdmainutils coreutils cpio [...]

Need to get 48.5MiB/56.0MiB of archives. After unpacking 46.7MiB will be used.
Do you want to continue? [y/N/q] q

$ wc -c ./cupt-compiled-i386
3584686 ./cupt-compiled-i386


This is part of the log from my i386 chroot.

The only extra dependency of the binary is 'libcurl3-gnutls', for downloading via http/https/ftp. It does not even need perl-base to work, but I can't remove perl-base from the system, as it's essential.

PAR rocks.
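For reference, PAR::Packer's 'pp' tool is what packs a Perl program together with its module dependencies into one self-contained executable. The post does not give the actual command used, so the invocation below is only my guess at its rough shape:

```shell
# Illustrative guess, not the author's actual command: with PAR::Packer
# installed, 'pp' bundles a Perl script plus its non-core modules into
# a single executable. Echoed here rather than executed, since packing
# cupt needs a Debian build environment.
script=/usr/bin/cupt
out=cupt-compiled-i386
echo "pp -o $out $script"
```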
